DecAug: Out-of-Distribution Generalization via Decomposed Feature Representation and Semantic Augmentation

Authors

Abstract

While deep learning demonstrates its strong ability to handle independent and identically distributed (IID) data, it often suffers from out-of-distribution (OoD) generalization, where the test data come from another distribution (w.r.t. the training one). Designing a general OoD generalization framework for a wide range of applications is challenging, mainly due to the different kinds of distribution shifts in the real world, such as the shift across domains or the extrapolation of correlation. Most previous approaches can only solve one specific kind of shift, leading to unsatisfactory performance when applied to various OoD benchmarks. In this work, we propose DecAug, a novel decomposed feature representation and semantic augmentation approach for OoD generalization. Specifically, DecAug disentangles category-related and context-related features by orthogonalizing the two gradients (w.r.t. intermediate features) of the losses for predicting the category and context labels, where category-related features contain causal information of the target object, while context-related features cause distribution shifts between training and test data. Furthermore, we perform gradient-based semantic augmentation on context-related features to improve the robustness of the learned representations. Experimental results show that DecAug outperforms other state-of-the-art methods on various OoD datasets, being among the very few approaches that can deal with different types of OoD generalization challenges.
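The abstract describes two mechanisms: disentangling category-related from context-related features by penalizing the alignment of two loss gradients taken w.r.t. an intermediate feature, and a gradient-based semantic augmentation of the context features. The following is a minimal sketch of that idea in PyTorch; the module and attribute names (DecAugSketch, cat_branch, ctx_branch, aug_step), the single-step perturbation, and the unweighted loss sum are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DecAugSketch(nn.Module):
    """Rough illustration of decomposed features + gradient-based augmentation."""

    def __init__(self, backbone, feat_dim, n_categories, n_contexts):
        super().__init__()
        self.backbone = backbone                               # any trainable feature extractor -> (B, feat_dim)
        self.cat_branch = nn.Linear(feat_dim, feat_dim)        # category-related features
        self.ctx_branch = nn.Linear(feat_dim, feat_dim)        # context-related features
        self.cat_head = nn.Linear(feat_dim, n_categories)      # predicts the category label
        self.ctx_head = nn.Linear(feat_dim, n_contexts)        # predicts the context label
        self.cls_head = nn.Linear(2 * feat_dim, n_categories)  # final classifier on concatenated features

    def forward(self, x, y_cat, y_ctx, aug_step=0.1):
        z = self.backbone(x)                                   # shared intermediate feature
        f_cat, f_ctx = self.cat_branch(z), self.ctx_branch(z)

        loss_cat = F.cross_entropy(self.cat_head(f_cat), y_cat)
        loss_ctx = F.cross_entropy(self.ctx_head(f_ctx), y_ctx)

        # Orthogonality term: penalize alignment of the two loss gradients
        # taken w.r.t. the shared intermediate feature z.
        g_cat = torch.autograd.grad(loss_cat, z, create_graph=True)[0]
        g_ctx = torch.autograd.grad(loss_ctx, z, create_graph=True)[0]
        loss_orth = F.cosine_similarity(g_cat.flatten(1), g_ctx.flatten(1), dim=1).pow(2).mean()

        # Gradient-based semantic augmentation: shift the context-related
        # feature along the gradient of the context loss before classifying.
        g_aug = torch.autograd.grad(loss_ctx, f_ctx, create_graph=True)[0]
        f_ctx_aug = f_ctx + aug_step * g_aug

        loss_cls = F.cross_entropy(self.cls_head(torch.cat([f_cat, f_ctx_aug], dim=1)), y_cat)
        # The paper weights these terms; equal weights here keep the sketch short.
        return loss_cls + loss_cat + loss_ctx + loss_orth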

Similar references

Domain Generalization via Invariant Feature Representation

This paper investigates domain generalization: How to take knowledge acquired from an arbitrary number of related domains and apply it to previously unseen domains? We propose Domain-Invariant Component Analysis (DICA), a kernel-based optimization algorithm that learns an invariant transformation by minimizing the dissimilarity across domains, whilst preserving the functional relationship betwe...

A Model of Concept Generalization and Feature Representation in Hierarchies

We present a novel modeling framework for representing category exemplars and features. This approach treats each category exemplar as a probability distribution over a hierarchically structured graph. The model jointly learns the mixture of each exemplar across categories in the graph, and a feature representation for each node in the graph, including nodes for which no data is directly observ...

Semantic Feature Representation to Capture News Impact

Mining natural language text for real world applications can benefit from a data representation that encodes semantic structure. Tree-based features, however, limit the available learning methods. This paper presents a study where semantic frames are used to mine financial news so as to quantify the impact of news on the stock market. We represent news documents in a novel semantic tree structu...

Generalization of word retrieval following semantic feature treatment.

OBJECTIVES The effectiveness of a Semantic Feature Treatment (SFT) at increasing word retrieval accuracy of untreated words was examined in relation to the influence of the number of shared features with treated words. Generalization of these improvements to discourse was also examined. METHODS Three adults with chronic aphasia completed 12 SFT sessions. Generalization to untreated words with...

Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i8.16829